Variance reduction of estimators arising from Metropolis-Hastings algorithms

Authors

  • George Iliopoulos
  • Sonia Malefaki
Abstract

The Metropolis–Hastings algorithm is one of the most basic and well-studied Markov chain Monte Carlo methods. It generates a Markov chain whose limiting distribution is the target distribution by simulating observations from a different proposal distribution. A proposed value is accepted with a particular probability; otherwise, the previous value is repeated. As a consequence, each accepted value is repeated a positive number of times, and thus any resulting ergodic mean is, in fact, a weighted average. It turns out that this weighted average is an importance sampling-type estimator with random weights. By the standard theory of importance sampling, replacing these random weights by their (conditional) expectations leads to more efficient estimators. In this paper we study the estimator that arises by replacing the random weights with certain estimators of their conditional expectations. We illustrate by simulations that it is often more efficient than the original estimator, while in the case of the independence Metropolis–Hastings algorithm and for distributions with finite support we formally prove that it is even better than the “optimal” importance sampling estimator.
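The weighted-average view of the ergodic mean can be made concrete with a small numerical sketch. The following is our own illustration, not the paper's exact construction: an independence Metropolis–Hastings chain with target N(0, 1) and proposal N(0, 2²) (both choices are ours), whose ergodic mean is rewritten as a weighted average of the distinct accepted values with random repeat-count weights; the random weights are then replaced by Monte Carlo estimates of their conditional expectations 1/a(x), where a(x) is the acceptance probability out of state x.

```python
import numpy as np

rng = np.random.default_rng(0)

# Independence MH: target N(0,1), proposal N(0, 2^2) (illustrative choices).
def log_pi(x):
    return -0.5 * x ** 2

def log_q(x):
    return -0.5 * (x / 2.0) ** 2

n = 20_000
x = 0.0
chain = np.empty(n)
for i in range(n):
    y = rng.normal(0.0, 2.0)
    # acceptance ratio pi(y) q(x) / (pi(x) q(y)), computed in logs
    log_alpha = (log_pi(y) - log_q(y)) - (log_pi(x) - log_q(x))
    if np.log(rng.uniform()) < log_alpha:
        x = y
    chain[i] = x

# The ergodic mean is a weighted average of the distinct accepted values,
# with random weights equal to the number of times each value is repeated.
starts = np.flatnonzero(np.concatenate(([True], chain[1:] != chain[:-1])))
distinct = chain[starts]
counts = np.diff(np.concatenate((starts, [n])))
ergodic_mean = (counts * distinct).sum() / counts.sum()  # equals chain.mean()

# Replace each random weight by an estimate of its conditional expectation
# 1 / a(x): the acceptance probability a(x) out of state x is estimated by
# averaging the acceptance ratio over fresh draws from the proposal.
m = 200
ys = rng.normal(0.0, 2.0, size=m)
log_w_y = log_pi(ys) - log_q(ys)
log_w_x = log_pi(distinct) - log_q(distinct)
a_hat = np.minimum(1.0, np.exp(log_w_y[None, :] - log_w_x[:, None])).mean(axis=1)
smoothed = (distinct / a_hat).sum() / (1.0 / a_hat).sum()
```

Both `ergodic_mean` and `smoothed` estimate E_π[X] = 0; the abstract's point is that estimators of the second kind can have smaller variance than the plain ergodic mean.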


Similar references

Approximating Bayes Estimates by Means of the Tierney Kadane, Importance Sampling and Metropolis-Hastings within Gibbs Methods in the Poisson-Exponential Distribution: A Comparative Study

Here, we work on the problem of point estimation of the parameters of the Poisson-exponential distribution through the Bayesian and maximum likelihood methods based on complete samples. The point Bayes estimates under the symmetric squared error loss (SEL) function are approximated using three methods, namely the Tierney Kadane approximation method, the importance sampling method and the Metrop...


Control Variates for the Metropolis-Hastings Algorithm (Norges Teknisk-Naturvitenskapelige Universitet)

We propose new control variates for variance reduction in the Metropolis–Hastings algorithm. We use variates that are functions of both the current state of the Markov chain and the proposed new state. This enables us to specify control variates which have known mean values for general target and proposal distributions. We develop the ideas for both the standard Metropolis–Hastings algorithm and...
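One simple statistic of this kind, given here as our own illustration rather than the variates proposed in that report: conditional on the pair (x, y), the acceptance indicator has mean α(x, y), so Z = I{accept} − α(x, y) has known mean zero for any target and proposal. A minimal sketch with a random-walk chain on N(0, 1) (all concrete choices are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

# Target N(0,1), up to a normalizing constant.
def log_pi(x):
    return -0.5 * x ** 2

n = 20_000
x = 0.0
h = np.empty(n)   # samples of the function of interest, h(x) = x
z = np.empty(n)   # control variate values, known to have mean zero
for i in range(n):
    y = x + rng.normal(0.0, 1.0)          # symmetric random-walk proposal
    alpha = min(1.0, np.exp(log_pi(y) - log_pi(x)))
    accept = rng.uniform() < alpha
    if accept:
        x = y
    h[i] = x
    z[i] = accept - alpha                  # E[I{accept} | x, y] = alpha

# Estimated optimal linear coefficient b = cov(h, z) / var(z).
b = np.cov(h, z)[0, 1] / z.var()
plain = h.mean()
controlled = (h - b * z).mean()            # control-variate-adjusted estimate
```

Both estimates target E_π[X] = 0; because E[Z] = 0 exactly, subtracting b·Z leaves the estimator unbiased in the mean while potentially lowering its variance.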


Over-relaxation Methods and Metropolis-Hastings Coupled Markov Chains for Monte Carlo Simulation

This paper is concerned with improving the performance of Markov chain algorithms for Monte Carlo simulation. We propose a new algorithm for simulating from multivariate Gaussian densities. This algorithm combines ideas from Metropolis-coupled Markov chain Monte Carlo methods and from an existing algorithm based only on over-relaxation. The speed of convergence of the proposed and existing algo...


Variance Bounding Markov Chains by Gareth

We introduce a new property of Markov chains, called variance bounding. We prove that, for reversible chains at least, variance bounding is weaker than, but closely related to, geometric ergodicity. Furthermore, variance bounding is equivalent to the existence of usual central limit theorems for all L² functionals. Also, variance bounding (unlike geometric ergodicity) is preserved under the Pesk...


Bridge estimation of the probability density at a point

The aim is to estimate the value taken by a probability density at a point in the state space. When the normalisation of the prior density is known, this value may be used to estimate a Bayes factor. It is shown that the multi-block Metropolis-Hastings estimators of Chib and Jeliazkov (2001) are bridge sampling estimators. This identification leads to estimators for the quantity of interest which may be s...



Journal:
  • Statistics and Computing

Volume 23, Issue -

Pages -

Publication year 2013